@julien-c julien-c commented Jun 3, 2025

According to the OpenAI documentation, the canonical sequence of tool-call streaming chunks looks like this:

[{"index": 0, "id": "call_DdmO9pD3xa9XTPNJ32zg2hcA", "function": {"arguments": "", "name": "get_weather"}, "type": "function"}]
[{"index": 0, "id": null, "function": {"arguments": "{\"", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "location", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "\":\"", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "Paris", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": ",", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": " France", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "\"}", "name": null}, "type": null}]
null

In ollama's case (tested on llama3.2:3b at least), the sequence of chunks is different: the first chunk can already contain the full arguments string.

<Tool call_uiutvfya>
task_complete {}
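To illustrate the fix, here is a minimal sketch of a delta accumulator that handles both orderings: the OpenAI-style sequence where `arguments` arrives token by token, and the ollama-style sequence where the first chunk already carries the complete arguments string. The helper name `accumulate_tool_calls` is hypothetical and not part of the actual patch.

```python
import json


def accumulate_tool_calls(chunks):
    """Merge streamed tool-call deltas into complete tool calls.

    Tolerates providers (e.g. ollama) that send the full arguments
    string in the first chunk instead of streaming it piecewise.
    """
    calls = {}
    for delta in chunks:
        if delta is None:  # end-of-stream sentinel
            continue
        for tc in delta:
            idx = tc["index"]
            call = calls.setdefault(idx, {"id": None, "name": None, "arguments": ""})
            if tc.get("id"):
                call["id"] = tc["id"]
            fn = tc.get("function") or {}
            if fn.get("name"):
                call["name"] = fn["name"]
            if fn.get("arguments"):
                # Concatenation works whether arguments arrive in one
                # chunk or spread over many.
                call["arguments"] += fn["arguments"]
    # Parse the accumulated JSON only once the stream has ended,
    # avoiding the JSON.parse-on-partial-string error.
    for call in calls.values():
        call["arguments"] = json.loads(call["arguments"] or "{}")
    return [calls[i] for i in sorted(calls)]
```

Parsing only after the stream completes is what avoids the `JSON.parse` error on partial argument strings reported in the linked issue.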

This should close #1502

@Wauplin Wauplin left a comment


Looks good 👍

@julien-c julien-c merged commit 49d93f2 into main Jun 4, 2025
5 checks passed
@julien-c julien-c deleted the ollama-mcp-client-fix branch June 4, 2025 09:41



Merging this pull request closes the linked issue: mcp-client JSON.parse error for local ollama